Exploration server on Caltech

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information is therefore not validated.

Minimizing memory loss in learning a new environment

Internal identifier: 000441 (Main/Exploration); previous: 000440; next: 000442


Authors: Khalid Al-Mashouq [Saudi Arabia]; Yaser Abu-Mostafa [United States]; Khaled Al-Ghoneim [Saudi Arabia]

Source:

RBID : ISTEX:0B6DEDB00E7CB804CDFFDE88E0F6B0191ACD398E

Abstract

Humans and other living species can learn new concepts without losing the old ones. Artificial neural networks, on the other hand, tend to “forget” old concepts. In this paper, we present three methods to minimize the loss of old information. These methods are analyzed and compared for the linear model. In particular, a method called network sampling is shown to be optimal under a certain condition on the sampled data distribution. We also show how to apply these methods to nonlinear models.

URL:
DOI: 10.1016/S0925-2312(01)00400-3


Affiliations:


Links to previous steps (curation, corpus, ...)


The document in XML format

<record>
<TEI wicri:istexFullTextTei="biblStruct">
<teiHeader>
<fileDesc>
<titleStmt>
<title>Minimizing memory loss in learning a new environment</title>
<author>
<name sortKey="Al Mashouq, Khalid" sort="Al Mashouq, Khalid" uniqKey="Al Mashouq K" first="Khalid" last="Al-Mashouq">Khalid Al-Mashouq</name>
</author>
<author>
<name sortKey="Abu Mostafa, Yaser" sort="Abu Mostafa, Yaser" uniqKey="Abu Mostafa Y" first="Yaser" last="Abu-Mostafa">Yaser Abu-Mostafa</name>
</author>
<author>
<name sortKey="Al Ghoneim, Khaled" sort="Al Ghoneim, Khaled" uniqKey="Al Ghoneim K" first="Khaled" last="Al-Ghoneim">Khaled Al-Ghoneim</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">ISTEX</idno>
<idno type="RBID">ISTEX:0B6DEDB00E7CB804CDFFDE88E0F6B0191ACD398E</idno>
<date when="2001" year="2001">2001</date>
<idno type="doi">10.1016/S0925-2312(01)00400-3</idno>
<idno type="url">https://api.istex.fr/document/0B6DEDB00E7CB804CDFFDE88E0F6B0191ACD398E/fulltext/pdf</idno>
<idno type="wicri:Area/Main/Corpus">000496</idno>
<idno type="wicri:Area/Main/Curation">000496</idno>
<idno type="wicri:Area/Main/Exploration">000441</idno>
<idno type="wicri:explorRef" wicri:stream="Main" wicri:step="Exploration">000441</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title level="a">Minimizing memory loss in learning a new environment</title>
<author>
<name sortKey="Al Mashouq, Khalid" sort="Al Mashouq, Khalid" uniqKey="Al Mashouq K" first="Khalid" last="Al-Mashouq">Khalid Al-Mashouq</name>
<affiliation wicri:level="1">
<country wicri:rule="url">Arabie saoudite</country>
</affiliation>
<affiliation wicri:level="1">
<country xml:lang="fr">Arabie saoudite</country>
<wicri:regionArea>EE Department, King Saud University, P.O. Box 800, Riyadh 11421</wicri:regionArea>
<wicri:noRegion>Riyadh 11421</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Abu Mostafa, Yaser" sort="Abu Mostafa, Yaser" uniqKey="Abu Mostafa Y" first="Yaser" last="Abu-Mostafa">Yaser Abu-Mostafa</name>
<affiliation wicri:level="1">
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>Caltech, Pasadena, CA</wicri:regionArea>
<wicri:noRegion>CA</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Al Ghoneim, Khaled" sort="Al Ghoneim, Khaled" uniqKey="Al Ghoneim K" first="Khaled" last="Al-Ghoneim">Khaled Al-Ghoneim</name>
<affiliation wicri:level="1">
<country xml:lang="fr">Arabie saoudite</country>
<wicri:regionArea>EE Department, King Saud University, P.O. Box 800, Riyadh 11421</wicri:regionArea>
<wicri:noRegion>Riyadh 11421</wicri:noRegion>
</affiliation>
</author>
</analytic>
<monogr></monogr>
<series>
<title level="j">Neurocomputing</title>
<title level="j" type="abbrev">NEUCOM</title>
<idno type="ISSN">0925-2312</idno>
<imprint>
<publisher>ELSEVIER</publisher>
<date type="published" when="2001">2001</date>
<biblScope unit="volume">38–40</biblScope>
<biblScope unit="supplement">C</biblScope>
<biblScope unit="page" from="1051">1051</biblScope>
<biblScope unit="page" to="1057">1057</biblScope>
</imprint>
<idno type="ISSN">0925-2312</idno>
</series>
<idno type="istex">0B6DEDB00E7CB804CDFFDE88E0F6B0191ACD398E</idno>
<idno type="DOI">10.1016/S0925-2312(01)00400-3</idno>
<idno type="PII">S0925-2312(01)00400-3</idno>
</biblStruct>
</sourceDesc>
<seriesStmt>
<idno type="ISSN">0925-2312</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass></textClass>
<langUsage>
<language ident="en">en</language>
</langUsage>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Humans and other living species can learn new concepts without losing the old ones. Artificial neural networks, on the other hand, tend to “forget” old concepts. In this paper, we present three methods to minimize the loss of old information. These methods are analyzed and compared for the linear model. In particular, a method called network sampling is shown to be optimal under a certain condition on the sampled data distribution. We also show how to apply these methods to nonlinear models.</div>
</front>
</TEI>
<affiliations>
<list>
<country>
<li>Arabie saoudite</li>
<li>États-Unis</li>
</country>
</list>
<tree>
<country name="Arabie saoudite">
<noRegion>
<name sortKey="Al Mashouq, Khalid" sort="Al Mashouq, Khalid" uniqKey="Al Mashouq K" first="Khalid" last="Al-Mashouq">Khalid Al-Mashouq</name>
</noRegion>
<name sortKey="Al Ghoneim, Khaled" sort="Al Ghoneim, Khaled" uniqKey="Al Ghoneim K" first="Khaled" last="Al-Ghoneim">Khaled Al-Ghoneim</name>
<name sortKey="Al Mashouq, Khalid" sort="Al Mashouq, Khalid" uniqKey="Al Mashouq K" first="Khalid" last="Al-Mashouq">Khalid Al-Mashouq</name>
</country>
<country name="États-Unis">
<noRegion>
<name sortKey="Abu Mostafa, Yaser" sort="Abu Mostafa, Yaser" uniqKey="Abu Mostafa Y" first="Yaser" last="Abu-Mostafa">Yaser Abu-Mostafa</name>
</noRegion>
</country>
</tree>
</affiliations>
</record>
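
The TEI record above can be processed programmatically. As a minimal sketch, the snippet below parses a simplified, hypothetical excerpt of the record (the real record carries additional `wicri:*` attributes whose namespace prefixes would need to be declared before standard XML parsers accept it) and lists each author with the country of their affiliation:

```python
# Minimal sketch: extract author names and affiliation countries from a
# simplified excerpt of the TEI record. The RECORD string below is a
# hypothetical reduction of the real record, not the record itself.
import xml.etree.ElementTree as ET

RECORD = """<record>
  <analytic>
    <author>
      <name first="Khalid" last="Al-Mashouq">Khalid Al-Mashouq</name>
      <affiliation><country>Arabie saoudite</country></affiliation>
    </author>
    <author>
      <name first="Yaser" last="Abu-Mostafa">Yaser Abu-Mostafa</name>
      <affiliation><country>\u00c9tats-Unis</country></affiliation>
    </author>
    <author>
      <name first="Khaled" last="Al-Ghoneim">Khaled Al-Ghoneim</name>
      <affiliation><country>Arabie saoudite</country></affiliation>
    </author>
  </analytic>
</record>"""

root = ET.fromstring(RECORD)
# findtext with a path expression reaches nested children directly.
authors = [
    (a.findtext("name"), a.findtext("affiliation/country"))
    for a in root.iter("author")
]
for name, country in authors:
    print(f"{name} ({country})")
```

The same pattern extends to the full record once the `wicri` and `xml` namespaces are declared on the root element.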

To work with this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Wicri/Amerique/explor/CaltechV1/Data/Main/Exploration
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000441 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd -nk 000441 | SxmlIndent | more

To link to this page within the Wicri network

{{Explor lien
   |wiki=    Wicri/Amerique
   |area=    CaltechV1
   |flux=    Main
   |étape=   Exploration
   |type=    RBID
   |clé=     ISTEX:0B6DEDB00E7CB804CDFFDE88E0F6B0191ACD398E
   |texte=   Minimizing memory loss in learning a new environment
}}

Wicri

This area was generated with Dilib version V0.6.32.
Data generation: Sat Nov 11 11:37:59 2017. Site generation: Mon Feb 12 16:27:53 2024